
Original Research Article | OPEN ACCESS

Introducing a performance-based objective clinical examination into the pharmacy curriculum for students of Northern Cyprus

Abdikarim M Abdi1, Arijana Mestrovic1,2, Ilker Gelisen1, Onur Gultekin1, Dudu Ozkum Yavuz1, Sahan Saygı1, Hayder Al-Baghdadi1, Rumeysa Demirdamar3, Bilgen Basgut1

1Near East University, Faculty of Pharmacy, Nicosia, Northern Cyprus, Mersin10, Turkey; 2Department of Pharmacy, University of Split School of Medicine, Split, Croatia; 3European University of Lefke, Lefke, Northern Cyprus, Mersin10, Turkey.

For correspondence: Bilgen Basgut. Email: daud87@hotmail.com

Received: 12 October 2016        Accepted: 4 February 2017        Published: 31 March 2017

Citation: Abdi AM, Mestrovic A, Gelisen I, Gultekin O, Yavuz DO, Saygı S, et al. Introducing a performance-based objective clinical examination into the pharmacy curriculum for students of Northern Cyprus. Trop J Pharm Res 2017; 16(3):681-688 doi: 10.4314/tjpr.v16i3.25

© 2017 The authors.
This is an Open Access article that uses a funding model which does not charge readers or their institutions for access. It is distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0) and the Budapest Open Access Initiative (http://www.budapestopenaccessinitiative.org/read), which permit unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited.

Abstract

Purpose: To describe how a formative Objective Structured Clinical Examination was applied to fourth year pharmacy students at a university in Northern Cyprus.
Methods: A blueprint-guided performance-based objective clinical examination was implemented. Group-prepared case scenarios based on course objectives were used to develop 12 exam stations. Scenarios were discussed in common training sessions for both assessors (faculty members) and senior students (standardized patients). Pilot testing of all stations was carried out on the day of the examination. Competencies tested included medical history taking, pharmacotherapeutic knowledge application, systemic client assessment, evidence-based drug information (DI) manipulation, drug related problems (DRP) management, patient counseling and communication skills.
Results: The exam revealed that students were better in performing patient counseling (4.4 ± 0.23) and identification/resolution of DRPs (3.68 ± 0.18) than in DI tasks (2.00 ± 0.21) (p < 0.05). The students’ perceptions were positive with no significant differences in their average general performance compared to a written exam that had been previously carried out (p = 1.0).
Conclusion: The evaluation revealed that undergraduate pharmacy students in a Turkish school of pharmacy were better in performing patient counseling and identification/resolution of DRPs than in drug information manipulation tasks. Student satisfaction with OSCEs was higher compared to the written examination. The design and implementation of the formative assessment was successful with minimum cost, using only the existing available space and personnel.

Keywords: Objective structured clinical examination, Formative assessment, Pharmacy students assessment, Competency, Examination

Introduction

With the continuous evolution of advanced patient care services and practices, the need for reviewing and restructuring pharmacy education to ensure that outcomes reflect the needs of societies has arisen, both nationally and internationally.

Pharmacy undergraduate programs should prepare graduates with the knowledge, skills and attitudes needed to take on roles in promoting rational medication use and providing pharmaceutical care in a variety of settings, including communities and hospitals. Core competencies toward that goal should be well assessed and evaluated within curricula to provide accountability for the goals of pharmacy education [1,2]. Such competencies should be reviewed and regulated by both national and international pharmaceutical accreditation bodies such as the Accreditation Council for Pharmacy Education (ACPE) [1]. Driven by these bodies' standards and recommendations, many pharmacy schools in the US have adopted objective structured clinical examinations (OSCEs) as a primary tool for student competency assessment. Other pharmacy schools and colleges have preferred to use OSCEs within their curricula to assess the integration of student knowledge, skills, and communication, in contrast to traditional methods of knowledge assessment [1].

Over 1,500 papers have been published on teaching or assessment using standardized patients, evaluating clinical knowledge, professional judgment, communication, interpersonal skills, problem-solving skills and resolution development [1,2]. During an OSCE, candidates are observed and evaluated as they move through a series of stations in which they interview, examine and treat standardized patients who present some type of medical problem. The hallway of OSCE exam rooms, each occupied by a uniquely challenging patient, is usually a familiar milieu to the candidate [1].

As reported by Sturpe in 2009, only 37 % of 87 sampled pharmacy schools in the United States were using OSCEs in their curricula, while others reported plans to implement OSCEs in the near future. In addition, many pharmacy schools around the world have incorporated OSCEs into their curricula as an assessment tool in pharmacotherapy and other laboratory courses. OSCEs are also used in advanced pharmacy practice experience (APPE) courses, yearly summative examinations, continuing education activities and licensure processes in Canada, the US, the UK, Brazil, Japan and Malaysia as a primary component for assessing problem-based learning [1,2,4,10-12].

In Turkey, there are currently over 30 pharmacy schools, with 2,082 new students enrolling annually for undergraduate education, according to statistics from the Turkish Ministry of Higher Education [1]. However, although simulation activities have been reported in a few studies [1,2], there is no evidence of OSCEs being implemented in the pharmacy curricula at any of these schools. The barriers to adopting OSCEs that have been successfully implemented elsewhere in the world have never been studied in Turkey. Such barriers may include: costs; concerns about increased faculty workload; lack of faculty interest in trying new assessment techniques; lack of access to standardized patient case studies; concerns about the validity and reliability of the technique compared to other assessment methods; and lack of space to conduct OSCE activities [1,2].

In this report, the authors describe how a pharmacy school in Northern Cyprus implemented a formative OSCE exam as a pilot study before adopting it as a formal "assessment for learning" method for the advanced pharmacy practice experience course. This project was initiated by faculty members during the process of international certification provided by the Accreditation Council for Pharmacy Education (ACPE) as a direct response to the ACPE International Quality Criteria for Certification of Professional Degree Programs in Pharmacy 2012 and FIP Global Framework for Quality Assurance in Pharmacy Education, launched in 2014 [1,2].

The aim of this research is threefold: to describe how a formative OSCE was developed and applied; to evaluate and quantify the performance of the students in the OSCE in the areas of knowledge and clinical and social skills, as a pilot study before formally incorporating OSCEs into the curriculum; and to explore the students' perceptions of OSCEs as a new assessment tool.

Methods

The OSCE was carried out at Near East University (NEU) Faculty of Pharmacy, whose academic program is a five-year Master of Pharmacy (M.Pharm) degree. The OSCE methodology, purpose and structure were introduced to fourth-year pharmacy students seven days prior to the exam. A 13-station examination was prepared by the Faculty Assessment Committee (12 stations were clinical cases and one was a feedback station). Members of the Committee, comprising clinical preceptors (n = 5) and lecturing professors (n = 4), were initially trained at a two-day workshop on designing and implementing OSCE exams, given by an international lecturer. A blueprint was developed to guide case scenario preparation, based on the material covered in a clinical pharmacy course that the students had taken in the preceding semester, as stated in the course description. A committee of three persons was assigned to develop the 12 case stations and one feedback station, all of which were then peer-reviewed. Suggested changes were incorporated into the final versions of all 13 stations.

A checklist with a grading scale for skills and knowledge evaluation was developed for all clinical cases. An examiner and a standardized patient were assigned to each station with a clinical case. Examiners were predominantly faculty members (70 %) and preceptors (30 %), while standardized patients included faculty members (33 %), postgraduate students (33 %), and some undergraduate fifth-year students (33 %).

Both examiners and standardized patients were introduced to the OSCE exam methodology and objectives three weeks prior to the exam. They were all included in the OSCE case scenario acting training session, held one week before the exam.

A verbal declaration of commitment to exam confidentiality was provided by all standardized patients and examiners when the written case scenarios were given to them three days prior to the exam. Each standardized patient and assessor was asked to set aside 30 minutes on the training days for a session on their respective case, during which they were instructed on how to interact with different types of OSCE examinees. Participants role-played the cases during the training sessions, and individual performances were reviewed and discussed by both patients and assessors. On the exam day, a pilot round was administered before the students entered the exam, and a brief pre-OSCE orientation was provided for all students.

Twelve clinical cases were developed and paired into two sets of six, each pair assessing comparable skills, knowledge or attitudes within a different case scenario. This was done to maintain examination security and confidentiality without isolating the students for a long period of time. Students were grouped into two shifts, morning and afternoon, and each student was assigned to run one loop of six cases, followed by a feedback station.

Students were assessed on a range of skills by an examining committee of 29 people, including examiners, standardized patients, timekeepers, waiting room respondents and a general coordinator. Minimal available resources and settings were used, including 12 rooms (offices and classrooms), while faculty members, postgraduate students and undergraduate volunteers acted as standardized patients, which helped reduce the exam costs to a negligible level.

After the end of the exam, the students' perceptions were collected at the feedback station in an interview with two non-examiner committee members and two international education experts. Answers were immediately recorded in electronic format; interviews were not audiotaped. Students were questioned about their perception of the exam, case difficulties or ambiguities, standardized patient performance, the exam setting and timing, and whether they had received enough pre-guidance. They also shared their opinions on the advantages and disadvantages of regularly incorporating OSCEs as a primary method for assessing student clinical skills.

Competencies evaluated in the OSCE included response to symptoms and history taking, pharmacotherapeutic knowledge, systems based client assessment, data retrieval and interpretation using an evidence-based approach, providing general health advice, clinical prescription management problems, patient counseling and communication skills. The Institutional Review Board at Near East University determined that the study did not require ethical review.

Statistical analysis

Statistical analysis was performed using GraphPad Prism (version 6.0). Student scores and grades are expressed as mean ± SEM. One-way ANOVA with Tukey's test was used to compare student performance across stations. Fisher's exact test was used to compare students' performance in the written exam with that in the OSCE. P < 0.05 was considered statistically significant.
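The analysis described above can be sketched in open-source tooling as well. This is a minimal illustration, not the authors' actual analysis: the per-student arrays below are simulated from the reported station means and SEMs, and the pass/fail contingency table is entirely hypothetical, since the raw data are not published with the article.

```python
# Illustrative sketch only: data simulated from reported summary statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 77  # number of participating students

# Simulated per-student station scores; SD recovered as SEM * sqrt(n)
counseling = rng.normal(4.40, 0.23 * np.sqrt(n), n)  # Station 10 (patient counseling)
drp        = rng.normal(3.68, 0.18 * np.sqrt(n), n)  # Station 5 (DRP identification)
drug_info  = rng.normal(2.00, 0.21 * np.sqrt(n), n)  # Station 8 (drug information)

# One-way ANOVA across the three stations
f_stat, p_anova = stats.f_oneway(counseling, drp, drug_info)

# Tukey's HSD post-hoc pairwise comparisons (available in recent SciPy)
tukey = stats.tukey_hsd(counseling, drp, drug_info)

# Fisher's exact test on a hypothetical 2x2 pass/fail table (OSCE vs written)
pass_fail = [[70, 7],   # OSCE: pass, fail (made-up counts)
             [69, 8]]   # written exam: pass, fail (made-up counts)
odds_ratio, p_fisher = stats.fisher_exact(pass_fail)

print(f"ANOVA: F = {f_stat:.1f}, p = {p_anova:.3g}")
print(f"Fisher's exact: p = {p_fisher:.3f}")
```

With means this far apart relative to their spread, the ANOVA comes out significant, mirroring the reported station differences (p < 0.05), while the near-identical pass/fail columns give a non-significant Fisher result, mirroring the reported OSCE-versus-written comparison.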

Results

A total of 77 out of 81 fourth-year students (95 %) participated in the OSCE. The highest mark achieved in the examination was 29/30 and the lowest 3.5/30, with an average grade of 17/30. As shown in , the students scored their highest marks at Station 10 (hypertensive patient on atenolol with misconceptions about his medication; 4.4 ± 0.23), followed by Station 5 (pediatric patient with URTI; 3.68 ± 0.18), significantly higher than the lowest-scoring stations (p < 0.05).

The lowest grades were recorded at Station 8 (managing drug-drug interactions; 2.00 ± 0.21), followed by Station 4 (CVD risk assessment and providing medical information; 2.04 ± 0.22). When average student performance was compared for the same scenarios performed by different standardized patients and assessors, no significant differences were seen in the parallel simultaneous circuits in which faculty members were the assessors (Question 1a (Q1a) vs Q1b, p = 0.69; Q2a vs Q2b, p = 0.67; Q3a vs Q3b, p = 0.74). However, significant differences were found in those in which the assessors were postgraduate students (Q4a vs Q4b, p = 0.0001; Q5a vs Q5b, p = 0.001; Q6a vs Q6b, p = 0.003).

Participating students, in semi-structured group interviews (30-minute sessions for each of the seven groups, n = 12 students/group), stated that they greatly enjoyed the experience and felt that the exam resembled actual practice. This gave them self-confidence and clearly revealed their deficiencies and what they needed to improve in both their skills and knowledge. All the groups interviewed agreed on the need to review the educational curriculum to incorporate more patient-care-focused courses with more OSCE assessments, which some groups saw as a good assessment and teaching tool.

Students were generally satisfied with the knowledge and skill levels required in all cases. They felt the cases were realistic, and most groups were also satisfied with the pre-OSCE preparation and instructions. The students' average rating of the OSCE setting was 9 out of 10, and five of the seven groups felt that Station 5, which assessed "the response to symptoms and medication history taking for a pediatric patient", was the station they liked most. All seven groups agreed that the "drug information manipulation" station was the most difficult for them, and all agreed that the five-minute timeframe given for each station was enough. Student responses in the interviews are summarized in .

Discussion

According to the literature on OSCE settings and critiques, recommended measures to improve the validity and reliability of OSCEs include developing realistic case scenarios, guiding station development with a blueprint that pre-defines the exam domains to be assessed, and having all activities prepared by a team rather than an individual. Case scenarios should be peer-reviewed and paired with appropriate scoring rubrics to increase examination reliability [16-18].

Prior training of standardized patients and examiners is critical to assure station consistency, as is pilot testing of the OSCE stations just before the exam. An appropriate number of stations should also be developed, both to evaluate a wide range of competencies and skills and to reduce sampling errors.

There is no clear definition of the optimal number of independent assessments; however, 12 to 16 stations are commonly used in OSCEs for pharmacy licensing in the US and Canada [5,6].

In this OSCE exam, all of the above practices were considered and successfully applied, largely thanks to the pre-training the OSCE team received from an international expert in pharmacy education. The exam was guided by a blueprint, and the cases were prepared by a group of clinical pharmacists and hospital preceptors with clinical experience. Standardized patients and examiners were well prepared for the OSCE settings and trained in role-playing their specific case scenarios. A total of 13 stations were developed, and students were divided into two groups; each group rotated through a set of seven stations, the seventh being the feedback station.

Quality assurance procedures were carried out, including defining methods for reporting and managing errors, receiving announcements and scheduling break times.

No significant differences were noticed when comparing the average student scores in the OSCE and in the subject's written exam held a month prior, as shown in . Students considered the OSCE to be more reflective of their real knowledge and skills than the written exam had been. In the semi-structured interviews, the students were extremely positive about the method, seeing it as an effective assessment that could guide the further development of important skills they required.

The exam was highly efficient in terms of timing, facilities and cost; it did not require considerable financial resources or faculty time, in contrast to the reports of Stowe and Gardner and others [19,20]. High costs and the difficulties associated with development and implementation are generally the principal reasons for the limited adoption of OSCEs in pharmacy education [4].

Two approaches were adopted to assure confidentiality without isolating students for a long period. The first was to divide students into two shifts, morning and afternoon, with each shift assigned a different set of cases; this prevented each group from being isolated for more than two hours. However, a significant difference was still noted between the average student scores of the morning and afternoon shifts, which could be attributed to unanticipated differences in case difficulty or ambiguity between the two sets. The second approach was to run two exam arms with the same set of cases in parallel. Here, no significant differences were noted between a station and its parallel counterpart when faculty members were the assessors (Q1a vs Q1b, p = 0.69; Q2a vs Q2b, p = 0.67; Q3a vs Q3b, p = 0.74), whereas significant differences were found when the assessors were postgraduate students (Q4a vs Q4b, p = 0.0001; Q5a vs Q5b, p = 0.001; Q6a vs Q6b, p = 0.003). This suggests the need for further training and consistency assurance for postgraduate examiners and standardized patients compared to faculty members. With such training, the parallel-arm approach would be preferable, as it allows more stations to operate concurrently, reducing student isolation time without affecting exam consistency.

Fourth-year pharmacy students scored most highly in patient counseling tasks and identification/resolution of DRPs, while they were significantly weaker in drug information tasks and drug-drug interaction management. The same pattern was reported in a study of OSCEs carried out with pharmacy students in Malaysia [12]. This suggests the need to reinforce these skills in Turkish students, as they are crucial competencies for pharmacy graduates and practitioners.

In assessing the students' perceptions and experience, it was notable that the Turkish students' attitude towards advancing and developing their clinical skills, to cope with the global shift of pharmacy practice toward a more clinical, service-providing profession, was positive. The students suggested that more clinical courses and practice examples should be included in the current pharmacy curriculum. They perceived the OSCE to be a suitable assessment tool and expressed the wish to have more OSCEs during their studies. Positive student attitudes towards more patient-care-based education, along with the Turkish Parliament's 2014 pharmacy legislation [13], which permits all hospitals and clinics to assign more positions for clinical pharmacists, both promote the need to review pharmacy education in Turkey and Northern Cyprus to ensure that graduates are sufficiently competent to provide advanced pharmaceutical care services.

As a pilot study, the exam was highly feasible and successful and, to our knowledge, was the first OSCE to be carried out for pharmacy students in either a Turkish or a North Cypriot pharmacy school.

Limitations of the study

The main drawback of this examination was the inconsistency of cases between the two shifts, since not all students carried out the same tasks. Other methods could be considered in the future to maintain the balance between consistency, confidentiality and student comfort.

Conclusion

The study findings show that undergraduate pharmacy students in a Turkish school of pharmacy perform better in patient counseling and identification/resolution of DRPs than in drug information tasks. This reveals the need to review and reinforce curricula to strengthen the relevant areas of students' knowledge and skills; more emphasis should therefore be placed on drug information and literature interpretation. Students perceived the examination very positively, requesting that additional clinical knowledge and practice be incorporated into the pharmacy curriculum. They also suggested that case-solving exercises and other problem-based learning approaches should be more prevalent in their program. It is therefore extremely important to invest in Turkish students' positive perception of advancing pharmacy education in Turkey and Northern Cyprus, in order to keep up with global practice demands and to shift to a more patient-centered profession and educational system. The authors will continue to follow the implementation of the program and the students' progress in future years.

Abbreviations

ACPE, Accreditation Council for Pharmacy Education; APPE, Advanced Pharmacy Practice Experience; CVD, Cardiovascular Disease; DI, Drug Information; DRP, Drug-Related Problem; FIP, International Pharmaceutical Federation; MDI, Metered Dose Inhaler; OSCE, Objective Structured Clinical Examination; PDI, Powdered Dose Inhaler; Q1-Q6, Questions 1-6; SEM, Standard Error of the Mean; UK, United Kingdom; URTI, Upper Respiratory Tract Infection; US, United States of America.

Declarations

Acknowledgement

The authors express their appreciation to Mike Rose from ACPE for sharing his experience that greatly assisted the study, and are also thankful to their colleagues in Near East University Faculty of Pharmacy for their technical support during study implementation.

References

  1. International Pharmaceutical Federation. Statement of Policy on Good Pharmacy Education Practice [approved by FIP Council in Vienna in September 2000; accessed 2015 Mar 23]. Available from: http://www.fip.org/www/?page=statements
  2. International Pharmaceutical Federation. Quality Assurance of Pharmacy Education: the FIP Global Framework. 2nd ed [approved by FIP in The Hague, 2014; accessed 2015 Mar 23]. Available from: https://www.fip.org/files/fip/PharmacyEducation/Quality_Assurance/QA_Framework_2nd_Edition_online_version.pdf
  3. Accreditation Council for Pharmacy Education. Accreditation standards of 2016 [approved 2015 Jan 25; accessed 2015 Mar 23]. Available from: http://www.acpe-accredit.org/deans/standards.asp
  4. Sturpe DA. Objective structured clinical examinations in doctor of pharmacy programs in the United States. Am J Pharm Educ. 2010; 74(8): 148.
  5. Austin Z, O'Byrne C, Pugsley J, Quero Munoz L. Development and validation processes for an objective structured clinical examination (OSCE) for entry-to-practice certification in pharmacy: the Canadian experience. Am J Pharm Educ. 2003; 67(3): 76.
  6. Harden RM. What is an OSCE? Med Teach. 1988; 10(1): 19-22.
  7. Gelula MH, Yudkowsky R. Microteaching and standardized students support faculty development for clinical teaching. Acad Med. 2002; 77(9): 941.
  8. Sloan DA, Donnelly MB, Schwartz RW, Felts JL, Blue AV, Strodel WE. The use of objective structured clinical examination (OSCE) for evaluation and instruction in graduate medical education. J Surg Res. 1996; 63(1): 225-230.
  9. Kirton SB, Kravitz L. Objective Structured Clinical Examinations (OSCEs) compared with traditional assessment methods. Am J Pharm Educ. 2011; 75(6): 111.
  10. Eğitim Platformu. Eczacılık Fakültesi Taban Puanları ve Kontenjanları [Faculty of Pharmacy admission base scores and quotas]. [accessed 2015 Nov 04]. Available from: http://www.derszamani.net/eczacilik-fakultesi-tabanpuanlari.html
  11. Toklu HZ. Problem based pharmacotherapy teaching for pharmacy students and pharmacists. Curr Drug Deliv. 2013; 10(1): 67-70.
  12. Toklu H, Demirdamar R. The evaluation of prescription dispensing scores of the pharmacy students before and after the problem-based "rational drug use" course: results of the two years' experience. Marm Pharm J. 2013; 17(3): 175-180.
  13. McAleer S, Walker R. Objective structured clinical examination (OSCE). Occas Pap R Coll Gen Pract. 1990; 46: 39-42.
  14. Harden RM. Twelve tips for organizing an objective structured clinical examination (OSCE). Med Teach. 1990; 12(3-4): 259-264.
  15. Accreditation Council for Pharmacy Education. International Quality Criteria for Certification of Professional Degree Programs in Pharmacy [adopted by the ACPE International Services Program 2012 Jun 20; accessed 2015 Mar 23]. Available from: https://www.acpe-accredit.org/pdf/ACPE_InternationalQualityCriteria_06202012.pdf
  16. Stowe CD, Gardner SF. Real-time standardized participant grading of an objective structured clinical examination. Am J Pharm Educ. 2005; 69(3): 41.